3-Information Theory-Information

information and theory

Positions have a finite number of possible states {information}. Positions can be static, as in memories or registers, or moving, as in information channels. Mathematical rules {information theory, data} describe storing, retrieving, and transmitting data.

information extraction from data

States differ from other states, so information extraction at locations notes differences, rather than measuring amounts. Information is any difference, change, or selection from a set of possibilities.

Sampling theorems, such as Logan's zero-crossing theorem, describe how to extract information from data.
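A minimal sketch of the sampling idea, using the more familiar Nyquist-Shannon theorem rather than Logan's (the signal, rate, and grid below are illustrative assumptions): samples taken above twice the highest signal frequency determine the band-limited signal, recoverable by sinc interpolation.

import numpy as np

# Band-limited test signal: a 3 Hz sine, sampled at 10 Hz (above the
# Nyquist rate of 2 * 3 = 6 Hz).
f_signal = 3.0
f_sample = 10.0
t_samples = np.arange(0, 2, 1 / f_sample)
samples = np.sin(2 * np.pi * f_signal * t_samples)

# Whittaker-Shannon reconstruction: one shifted sinc per sample.
t_fine = np.linspace(0, 2, 1000)
recon = sum(s * np.sinc((t_fine - t_n) * f_sample)
            for s, t_n in zip(samples, t_samples))

exact = np.sin(2 * np.pi * f_signal * t_fine)
interior = (t_fine > 0.3) & (t_fine < 1.7)   # finite sample set degrades edges
print("max interior error:", np.max(np.abs(recon - exact)[interior]))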

probability

States have probabilities of being at locations. If a location takes states at random, there is no information, even if the states have known transitions. Non-random conditional probability is information.
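A minimal sketch of that last point, with an assumed two-state transition matrix: when conditional probabilities are non-random, knowing the current state reduces entropy, and that reduction is the information.

import numpy as np

# Assumed two-state source: rows are P(next state | current state).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

h_uncond = entropy(pi)                                       # no context
h_cond = np.sum(pi * np.array([entropy(row) for row in P]))  # given current state

print("H(next) =", h_uncond, "bits")
print("H(next | current) =", h_cond, "bits")
print("information in the dependence:", h_uncond - h_cond, "bits")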

system

Finite systems have finite numbers of elements, which have finite numbers of states. Systems are information spaces, and distributions are information sets. The highest-probability distribution is the one with the most possible states. Some outputs are typically more probable than others.

dependence

The difference between the sum of independent subsystem entropies and the actual system entropy measures dependence. System subsets can depend on the whole system {mutual information}.
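A minimal sketch with an assumed joint distribution over two binary subsystems: mutual information I(X;Y) = H(X) + H(Y) - H(X,Y), zero exactly when the subsystems are independent.

import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Assumed joint distribution P(x, y) for two dependent binary variables.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

px = joint.sum(axis=1)   # marginal distribution of subsystem X
py = joint.sum(axis=0)   # marginal distribution of subsystem Y

mutual = entropy(px) + entropy(py) - entropy(joint)
print("mutual information:", mutual, "bits")   # about 0.278 bits here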

data

Memories, registers, and information flows have state series {data}.

context

Preceding, following, adjacent, and related states define state environment {context, state} {data context}. Information meaning comes from context. Contexts have codes or contrasts. Syntax defines context relations.

code

Contexts have possible-symbol sets {code} {contrast, data}. Symbols have probabilities of being in contexts, which determine information amounts.
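A minimal sketch of that relation (the code and its probabilities below are assumptions): a symbol with probability p in its context carries -log2(p) bits, so rarer symbols carry more information.

import math

# Assumed code: symbol probabilities within one context.
code = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

for symbol, p in code.items():
    print(symbol, "carries", -math.log2(p), "bits")   # rarer -> more bits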

3-Information Theory-Information-Unit

bit of information

The smallest information amount {bit, information} involves one position that can have two equally probable states, so states have probability 1/2. If one position has one possible state, state probability is 1, but this situation has no differences and no information. If one position has three equally probable states, states have probability 1/3, requiring log2(3) ≈ 1.58 information bits. If one position has four equally probable states, states have probability 1/4, requiring two information bits. If two positions each have two equally probable states, pairs have probability 1/4, requiring two information bits.
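A minimal sketch of those counts: selecting one of n equally probable states needs log2(n) bits, which gives the values above, including log2(3) for the three-state case.

import math

# Bits needed to select one of n equally probable states.
for n in [1, 2, 3, 4]:
    print(n, "states, probability", 1 / n, "->", math.log2(n), "bits")

# Two positions, two equally probable states each: 4 pairs -> 2 bits.
print("two binary positions ->", math.log2(2 * 2), "bits")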

quantum bit

Smallest quantum-information amount {quantum bit}| {qubit} involves 0 and 1 superposition.

model

Sphere points, with 0 and 1 at opposite poles, can represent superpositions. Sphere points have probabilities of obtaining 0 or 1 at decoherence.
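A minimal sketch of the sphere model (the angles below are illustrative): the point at polar angle theta and azimuth phi stands for cos(theta/2)|0> + exp(i*phi)*sin(theta/2)|1>, and the squared amplitudes give the 0 and 1 probabilities at decoherence.

import numpy as np

def qubit_probs(theta, phi):
    # Amplitudes for the sphere point (theta, phi).
    amp0 = np.cos(theta / 2)
    amp1 = np.exp(1j * phi) * np.sin(theta / 2)
    return abs(amp0) ** 2, abs(amp1) ** 2

print(qubit_probs(0.0, 0.0))         # north pole: 0 with certainty
print(qubit_probs(np.pi, 0.0))       # south pole: 1 with certainty
print(qubit_probs(np.pi / 2, 0.0))   # equator: 0 and 1 equally probable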

information

Qubits yield at most one classical information bit {Holevo chi}, because output is 0 or 1 after decoherence. This information bit is the quantum equivalent of the information-theory bit (Shannon).

entanglement

Quantum particles can be in systems with overall quantum states, so quantum-particle states correlate by entanglement.

decoherence

Isolated systems can maintain quantum states, as in superconductivity and the quantum Hall effect. Measurements, gravity, and other interactions with larger systems can collapse wavefunctions and cause decoherence.

uses

Quantum states can teleport, because entanglement can transfer a state to another quantum system. Quantum states can use entanglement for cryptography keys. Quantum-mechanical computers use entangled qubits. Quantum computers can find integer prime factors in the same way as finding quantum-system energy levels. Quantum error correction can eliminate noise and random interactions of a quantum system with its environment, by correcting states without knowing what they are. However, an unknown-state quantum bit cannot be duplicated {no-cloning theorem}.

information channel

Two ensembles can link on paths {channel, information} {information channel} {communication channel} that carry information. Information channels transmit output sets for information categories. Information-channel receivers know the output set, how to process outputs, how to correct errors, and how to mitigate noise. Communication-channel input transforms into output, typically by selecting from available outputs. Physical information channels use frequency ranges, directions, times, and physical media.

bandwidth

A limited range of frequencies around the main frequency {bandwidth}| is usable for carrying information.

carrier frequency

Main frequency {carrier frequency}| can carry information. Information can be in frequency variations around the main frequency {frequency modulation, data}. Information can be in main-frequency amplitude variations {amplitude modulation, data}.
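A minimal sketch of both schemes, with an assumed 1 Hz message on a 20 Hz carrier: amplitude modulation varies the carrier's amplitude, frequency modulation varies its instantaneous frequency.

import numpy as np

t = np.linspace(0, 2, 4000)
f_carrier = 20.0                          # main frequency (assumed)
message = np.sin(2 * np.pi * 1.0 * t)     # 1 Hz information signal (assumed)

# Amplitude modulation: information in carrier-amplitude variations.
am = (1 + 0.5 * message) * np.sin(2 * np.pi * f_carrier * t)

# Frequency modulation: information in frequency variations around the carrier.
dt = t[1] - t[0]
phase = 2 * np.pi * np.cumsum(f_carrier + 5.0 * message) * dt
fm = np.sin(phase)

print("AM envelope varies:", am.min(), "to", am.max())
print("FM envelope constant:", fm.min(), "to", fm.max())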

channel capacity

Channels can carry numbers of bits each second {channel capacity}|. Channel capacity depends on bandwidth and noise: higher carrier frequencies allow wider bandwidths and so can carry more information. Channel capacity also depends on carrying method. Older methods are amplitude modulation and frequency modulation.
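A minimal sketch using the Shannon-Hartley relation, C = B * log2(1 + S/N), which ties capacity to bandwidth and signal-to-noise ratio (the bandwidths and ratio below are assumptions):

import math

def capacity(bandwidth_hz, snr):
    # Shannon-Hartley: maximum bits per second over a noisy analog channel.
    return bandwidth_hz * math.log2(1 + snr)

snr = 1000.0   # signal-to-noise power ratio, about 30 dB (assumed)
for b in [3e3, 1e6, 20e6]:   # assumed bandwidths: voice-line, broadcast, Wi-Fi scale
    print(f"{b:.0e} Hz -> {capacity(b, snr):.3e} bits/second")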
